Candidates 590
$600 / mo
≈ $7200 / year net
Junior Data Engineer
Ukraine · Kyiv · 6 months · Intermediate · Published today
$6000 / mo
≈ $72000 / year net
Data Engineer
Uzbekistan · 4 years of experience · Advanced/Fluent · Published today · In passive search
Modeling relational databases (ER modeling, relational schema, normal forms), data modeling: 3NF and dimensional (Star, Snowflake, Data Vault)
Python, Django, FastAPI, Pandas
Looker/LookML
Metabase, Tableau
Docker, Git (GitHub, Actions), Airflow
Nginx, Elasticsearch, Terraform, K8s (Helm), ArgoCD
AWS (EKS, Glue, S3, EC2, Lambda, Elastic Beanstalk, IAM, VPC…)
GCP (Beam, Pub/Sub, Instances, GCS, BigQuery, …)
Azure (Synapse, Data Factory, Databricks, ADLS…)
PySpark (Databricks)
Open table formats (Iceberg, Delta, Hudi), file formats (Avro, Parquet, ORC)
Storage (S3, ADLS, GCS)
Hadoop Streaming (MapReduce), Kafka
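A minimal sketch of the dimensional modeling listed above — a toy star schema with one fact table and two dimensions (all table and column names are invented for illustration):

```python
import sqlite3

# Toy star schema: one fact table referencing two dimension tables.
# Names (dim_date, dim_product, fact_sales) are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (1, '2024-01-01'), (2, '2024-01-02');
INSERT INTO dim_product VALUES (10, 'widget'), (20, 'gadget');
INSERT INTO fact_sales VALUES (1, 10, 5.0), (1, 20, 7.5), (2, 10, 2.5);
""")

# A typical dimensional query: total sales per product name.
rows = conn.execute("""
SELECT p.name, SUM(f.amount) AS total
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY p.name
ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 7.5)]
```

The same data in 3NF would split dimensions further; the star layout trades normalization for simple, fast aggregation joins.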
==================================
---- September 2023 – now _ ApexBank
ApexBank.uz
Data Engineer
Financial Sector
• Banking
Building DWH with:
1. Apache Spark (PySpark) on K8s for compute
2. Apache Hudi on S3 (MinIO) as the data lake
3. Greenplum as the DWH
4. ClickHouse (or StarRocks.io) as the serving layer for BI (still in progress)
5. Airflow for orchestration
6. Airbyte for CDC
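The CDC-plus-Hudi layers above boil down to upsert semantics: for each record key, keep the newest version. A pure-Python sketch of that merge rule, assuming hypothetical `id`/`updated_at` fields (a stand-in for what a Hudi copy-on-write table does, not the actual implementation):

```python
# Merge a CDC batch (as Airbyte might emit it) into a table snapshot,
# keeping the newest row per key — the upsert rule Hudi applies.
# Field names (id, updated_at) are invented for the demo.
def upsert(table, batch):
    """Return the table with the batch merged in, newest row per id wins."""
    merged = {row["id"]: row for row in table}
    for row in batch:
        current = merged.get(row["id"])
        if current is None or row["updated_at"] >= current["updated_at"]:
            merged[row["id"]] = row
    return sorted(merged.values(), key=lambda r: r["id"])

table = [{"id": 1, "balance": 100, "updated_at": 1},
         {"id": 2, "balance": 50, "updated_at": 1}]
batch = [{"id": 2, "balance": 75, "updated_at": 2},   # update
         {"id": 3, "balance": 10, "updated_at": 2}]   # insert
table = upsert(table, batch)
print(table)
```

In the real stack this merge runs inside Spark over S3 files; the rule itself is the same.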
---- September 2022 – 2023 _ Itransition
---- Data Engineer role
1. Converting BigQuery queries to Snowflake after two companies merged
2. Doing ELT (dbt) with Snowflake tasks, and running some SQL via Airflow, depending on which
team created the ticket.
3. Preparing the semantic layer for Looker so that BI engineers (including me) can build dashboards
and do some aggregations.
4. Building Looker reports and attaching them to business-related boards.
Python, Apache Airflow, GCP/AWS, Snowflake/BigQuery, Looker (LookML), SQL, dbt
---- September 2021 — September 2022 _ EPAM Systems
---- Python Engineer role
Extraction and validation of data from RSS feeds.
Extending functionalities of existing lambdas.
Python, AWS (Lambda, S3, Elastic Beanstalk, VPC, EC2)
---- August 2020 — September 2021 _ AlifTech
---- Data Engineer role
- Improved Tableau performance by replacing the original MySQL backend with MongoDB.
- Wrote data pipelines using Apache Airflow (including retrieving data from several MySQL
instances, processing it with Pandas, and storing it in MongoDB as the data warehouse).
- Automated daily manual jobs with Airflow including sending daily reports to other departments.
- Automated ML model builds by refactoring code into Airflow DAGs and Tasks.
Airflow, Python, Pandas, FastAPI, Tableau
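A toy, stdlib-only sketch of the pipeline shape described above — extract from several source databases, aggregate, and load documents into a warehouse. Here sqlite3 stands in for the MySQL instances and a list of dicts for MongoDB; all names are invented:

```python
import sqlite3

# Extract-transform-load in miniature: several sources, one aggregate,
# one document store. sqlite3 and a plain list are stand-ins for the
# real MySQL/MongoDB systems; schema and names are illustrative.
def make_source(rows):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (city TEXT, total REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return db

sources = [make_source([("Kyiv", 10.0), ("Lviv", 5.0)]),
           make_source([("Kyiv", 2.5)])]

# Extract + transform: union the sources and sum totals per city.
totals = {}
for db in sources:
    for city, total in db.execute("SELECT city, total FROM orders"):
        totals[city] = totals.get(city, 0.0) + total

# Load: one document per city into the "warehouse".
warehouse = [{"_id": city, "total": t} for city, t in sorted(totals.items())]
print(warehouse)  # [{'_id': 'Kyiv', 'total': 12.5}, {'_id': 'Lviv', 'total': 5.0}]
```

In the real pipeline each stage would be an Airflow task, with Pandas doing the transform step.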
-- Modeling relational databases (ER modeling, relational schema, normal forms), data modeling: 3NF and dimensional (Star/Snowflake)
-- Python, Django, FastAPI, Pandas
-- K8s (Helm), Docker, Git (GitHub, Actions), Airflow, Metabase, Nginx, AWS/GCP
-- Hadoop (MapReduce), PySpark (Databricks), table formats (Iceberg, Delta), file formats (Avro, Parquet, ORC), storage (S3, ADLS, GCS, HDFS)
$2000 / mo
≈ $24000 / year net
Data Analyst, Power BI Developer
Ukraine · Kyiv · 2 years of experience · Upper-Intermediate · Published today
+ R scripts for data cleaning and analysis (mostly),
+ Python data cleaning and analysis (rarely),
+ Power BI dashboards with DAX manipulations,
+ XLSForms data collection support and design
2022-2023:
+ Node.js Express server [OAuth2, auth, API, e-mailing, MongoDB, Swagger | Jira, Git] for the SoYommy recipe-management project
Role: a team lead of the back-end team;
+ Full-stack web scrapers for e-commerce: getting prices and product data from specific e-shops and manipulating them in my React app on the front end.
Node.js, Cheerio, WebSockets, Axios. Role: the only developer.
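As a rough illustration of the scraping step (the original used Node.js + Cheerio; this is a stdlib-Python stand-in, and the markup and class names are invented):

```python
from html.parser import HTMLParser

# Extract product prices from HTML — the core of a price scraper.
# The "price" class and the sample markup are made up for the demo.
class PriceParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # Flag that the next text node is a price value.
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(float(data))
            self.in_price = False

html = '<div><span class="price">19.99</span><span class="price">5.50</span></div>'
parser = PriceParser()
parser.feed(html)
print(parser.prices)  # [19.99, 5.5]
```

Cheerio plays the same role on Node.js, with jQuery-style selectors instead of a streaming parser.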
+ Prozorro tender tracker: finding and tracking specific tenders
Node.JS, Express, MongoDB, REST API, React, Redux
My main goal at this stage is to dive deep into the data analysis field, gain experience, and find the right direction for further growth.
$3000 / mo
≈ $36000 / year net
SQL Developer/Data Engineer
Ukraine · 7 years of experience · Intermediate · Published today
Stack:
- MS SQL, PostgreSQL, Oracle, Teradata;
- C#/.Net;
- ADO.Net, ASP.Net Core, MudBlazor
- MicroStrategy BI, Power BI;
In my previous roles, my responsibilities encompassed:
Created, maintained, and automated reports using Excel and BI tools;
Created queries, tables, views, and procedures;
Prepared datasets for analytics;
Developed C# services to synchronize data between databases, services, and APIs;
Created Data Factory pipelines;
Engaged with MudBlazor: crafting pages, components, and dialogs within the administrative panel.
$550 / mo
≈ $6600 / year net
Python Developer
Ukraine · Kyiv · 6 months · Advanced/Fluent · Published today
English C2, German B1
Experience:
- Python Engineer (Jan - March 2024)
Participated in a friend's startup in my free time. Created a website and a bot. Studied DSPy and the basics of RAG + LLM models.
- Vendor Relations Specialist (Jan 2021 – Feb 2023)
Solved order issues; performed Oracle CRM actions.
Google certified data engineer & cloud architect
Open to positions that require other languages. Familiar with Node.js, Go, and PHP. Having experience building in Python, I'll quickly learn any popular language.
$1000 / mo
≈ $12000 / year net
Data Engineer
Ukraine · Lviv · 1 year of experience · Upper-Intermediate · Published today
Managing databases, ETL procedures, and data presentation; working with
teams, maintaining systems, and following compliance rules.
Used Apache Spark and Apache Airflow to analyze social media data and
extract insights on trends in the cryptocurrency market. Managed data
pipelines effectively with Airflow, while Spark enabled scalable
processing of big data sets. Applied methods such as sentiment analysis
to gauge market mood. Worked with a team to deliver useful insights and
tackle obstacles efficiently. Acquired hands-on expertise in distributed
computing, workflow automation, and data analysis; as a result,
increased the accuracy of currency analysis by up to 40%.
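A minimal sketch of the lexicon-based sentiment step such a pipeline might apply before trend analysis (the lexicon and the posts are invented; a production pipeline would run this as a Spark job over the full feed):

```python
# Score posts by summing lexicon weights over their words; the sign of
# the score gives the sentiment label. Lexicon and posts are invented.
LEXICON = {"moon": 1, "bullish": 1, "pump": 1,
           "crash": -1, "bearish": -1, "dump": -1}

def score(post):
    """Sum of lexicon weights for the words in the post."""
    return sum(LEXICON.get(w, 0) for w in post.lower().split())

posts = ["BTC to the moon, very bullish",
         "huge dump incoming, bearish market",
         "just holding"]
labels = ["pos" if score(p) > 0 else "neg" if score(p) < 0 else "neutral"
          for p in posts]
print(labels)  # ['pos', 'neg', 'neutral']
```

Real systems replace the hand-built lexicon with a trained model, but the map-then-aggregate shape is the same.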
Energetic Junior Data Engineer skilled in ETL, data modeling, and
database optimization. Experienced in Python and SQL, with a focus
on robust data quality checks and collaborative problem-solving.
$600 / mo
≈ $7200 / year net
Trainee/Junior Data Analyst, Trainee/Junior Data Engineer
Ukraine · Kyiv · 6 months · Upper-Intermediate · Published today
I have already done several small projects using T-SQL, SSIS, and Python: I created databases, generated data, and composed queries to create, change, and retrieve data. During my university studies I completed several simple data analytics projects on the convergence of Ukraine with the European Union and the Eurasian Economic Union, as well as cluster, correlation, and regression analysis of the regions of Ukraine and a quantitative assessment of the cluster-formation potential of Ukrainian regions; I primarily used Excel, Tableau, SPSS, and Statistica.
Additionally, I am studying in the EPAM Data Analytics Engineering Program and have already completed quite a few specialist educational programmes, such as:
1. EPAM Data Analytics Engineering Basics Program.
2. Professional Certificate "Google Data Analytics" by Google
3. SQL courses (DataCamp)
4. Specialization "Python for Everybody" by University of Michigan
5. Specialization "Python 3 Programming" by University of Michigan
6. Specialization "Excel Skills for Business" by Macquarie University
7. Specialization "Microsoft 365 Fundamentals" by Microsoft
8. Half of the specialization "Data Science" by IBM
9. Course "Introduction to Data Science in Python" by University of Michigan
10. Course "Data Analysis Using Python" by University of Pennsylvania
11. Course "Technology Entrepreneurship: Lab to Market" by Harvard University
12. Course "Development of an innovative product" by MIPT
13. Specialization "Financial instruments for a private investor" by HSE University
14. Professional Certificate "Customer Experience" by MAMI
I have also completed 60+ database problems on LeetCode.
$7000 / mo
≈ $84000 / year net
Team/Tech Lead/Architect/Engineering Manager
Poland · 6 years of experience · Upper-Intermediate · Published today
Lead Engineer (Oct 2022 – now)
• Optimization of existing processes and procedures
• Creating new pipelines, models, views, and drivers (Django + AWS)
• Creating unit tests for our drivers (PyTest + Python)
• Technical support of storage
• Writing and optimizing drivers and functions
• Kept project leaders regularly updated with progress, maintaining open, productive communication.
• Managed multiple projects with differing technologies, including delegating tasks, assessing quality and sign off.
Lead Engineer (June 2022 – now)
• Optimization of existing processes and procedures
• Creating new pipelines and drivers (Java + AWS)
• Creating unit tests for our drivers (Java Spring + Spring Boot)
• Technical support of storage
• Writing and optimizing drivers and functions
• Kept project leaders regularly updated with progress, maintaining open, productive communication.
Software Engineer (April 2021 – May 2022)
• Optimization of existing processes and procedures
• Creating new pipelines and drivers (Python + Airflow + Spark + AWS)
• Creating unit tests for our data drivers (Scala)
• Creating new data notebooks (Spark)
• Technical support of storage
• Big data migration process development
• Writing and optimizing views, stored procedures, and functions
Intetics
Software Engineer (September 2020 – April 2021)
• ETL process development
• Preparation of data for building reports
• Data migration process development
• Writing and optimization views, stored procedures, and functions, construction and support of storage
EPAM Systems Inc
Software Engineer (August 2019 – September 2020)
• Data collection from various sources: source DBs (Oracle, Postgres); various APIs (Beamery API, Google Analytics API, and others); flat files (Excel, CSV, JSON)
• Pre-processing of data before loading into the DWH (the storage is built on MS SQL Server)
• Creation and configuration of loading processes into the DWH (using SSIS packages and services, Python scripts, PowerShell scripts, and a .NET application)
• Creation of procedures, functions, and scripts to transform data in preparation for reports
• Development and support of storage performance
• Optimization of existing processes and procedures
GlowByte Consulting
Software Engineer (May 2018 – July 2019)
• ETL process development
• Preparation of data for building reports
• Data migration process development
$6000 / mo
≈ $72000 / year net
Data engineer
Greece · 10 years of experience · Upper-Intermediate · Published today
• Building Data Lake House;
• Building Data Warehouse;
• Implementation of data quality process;
• Creating ETL processes from different sources;
• Implementation of BI solutions;
• Migration of data to the cloud;
• Developed database objects: stored procedures, UDFs, calculation views, complicated queries.
• Developed Databricks notebooks and workflows
• Implemented and supported Spark Structured Streaming processes (Event Hubs & Databricks)
• Designed and implemented custom log structures (Dynatrace & Azure Log Analytics)
• Deployed and maintained snapshots in Snowflake
• Designed and implemented ADF pipelines
• Designed Azure Data Lake structure
• Performance tuning
• Troubleshooting
Data migration, validation, modeling, transformation.
Work with streaming data.
$6000 / mo
≈ $72000 / year net
Senior Data Engineer
Poland · 6 years of experience · Advanced/Fluent · Published today
- Building Data Warehouses in BigQuery: from raw data ingestion to materialized tables and views
- Advanced SQL: Built custom CTEs daily for 3 years
- Python & R as a working language, occasionally Scala Spark
- Data modelling & analysis at a large EU eCommerce company using an AWS and BigQuery stack (Athena, SageMaker with EMR, dbt, Dataform, advanced Tableau)
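As a small illustration of CTE-style SQL like that mentioned above — a per-user running total computed with a CTE and a window function. The schema is invented for the demo, and sqlite3 stands in for BigQuery:

```python
import sqlite3

# A CTE pre-aggregates events per user and day; a window function then
# computes the running total per user. Table/column names are invented.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (user_id INT, day INT, clicks INT)")
db.executemany("INSERT INTO events VALUES (?, ?, ?)",
               [(1, 1, 3), (1, 2, 2), (2, 1, 4)])

rows = db.execute("""
WITH daily AS (
    SELECT user_id, day, SUM(clicks) AS clicks
    FROM events GROUP BY user_id, day
)
SELECT user_id, day,
       SUM(clicks) OVER (PARTITION BY user_id ORDER BY day) AS running
FROM daily ORDER BY user_id, day
""").fetchall()
print(rows)  # [(1, 1, 3), (1, 2, 5), (2, 1, 4)]
```

The same pattern (CTE for staging, window for the analytic step) carries over to BigQuery and Snowflake SQL nearly verbatim.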
Education:
- Bachelor degree in IT/Cybersecurity
- Coursera: Data Science, Machine Learning
Working for 6 years as a Data Analyst and Data Engineer, combined with 7 years as a marketing professional, has given me a broad and deep perspective on leveraging data insights for strategic decision-making and marketing effectiveness.
I enjoy having both deep and wide expertise in Data. Understanding the needs of data consumers made me a better analyst, and feeling data analysts pain points made me a more efficient engineer.
- Data ingestion with Airbyte and GCP Cloud Functions from various APIs into BigQuery
- Created conceptual data model for marketing needs
- Built materialized tables with dbt for use in Looker Studio
2. "Trust your data" project (data QA):
Initiated a "trust your data" sprint in the company after finding deep problems in data quality from a data modelling perspective. As a small group of a couple of data engineers and analysts, we defined new core concept definitions, which showed a 30% increase in my department's core KPI, both in current and historical data.
3. Learning Scala
Just for fun, I learned Scala to use it with Spark. Within a month I wrote a tool that took an SQL query as input, downloaded data from Presto in chunks, and converted it into CSV or Hyper (for Tableau) files. Previously, data analysts constantly ran into Presto limits; my tool eliminated that problem.
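The chunked-export idea can be sketched in Python (the original tool was Scala against Presto; here sqlite3 stands in for the query engine, and LIMIT/OFFSET paging is an assumption about the approach):

```python
import csv
import io
import sqlite3

# Page through a query with LIMIT/OFFSET and stream each chunk into CSV,
# so no single fetch exceeds the engine's result-size limits.
# sqlite3 stands in for Presto; the schema is invented for the demo.
def export_csv(db, query, chunk_size=2):
    buf = io.StringIO()
    writer = csv.writer(buf)
    offset = 0
    while True:
        rows = db.execute(f"{query} LIMIT ? OFFSET ?",
                          (chunk_size, offset)).fetchall()
        if not rows:
            break
        writer.writerows(rows)
        offset += len(rows)
    return buf.getvalue()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (id INT, name TEXT)")
db.executemany("INSERT INTO t VALUES (?, ?)",
               [(1, "a"), (2, "b"), (3, "c"), (4, "d"), (5, "e")])
out = export_csv(db, "SELECT id, name FROM t ORDER BY id")
print(out)
```

A deterministic ORDER BY matters here: without it, LIMIT/OFFSET pages can overlap or skip rows between chunks.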